51 research outputs found
OCRAPOSE II: An OCR-based indoor positioning system using mobile phone images
In this paper, we propose an OCR (optical character recognition)-based
localization system called OCRAPOSE II, which is applicable in a number of
indoor scenarios including office buildings, parking garages, airports, and
grocery stores. In these scenarios, characters (i.e., text or numbers) can be used
as suitable distinctive landmarks for localization. The proposed system takes
advantage of OCR to read these characters in the query still images and
provides a rough location estimate using a floor plan. Then, it finds depth and
angle-of-view of the query using the information provided by the OCR engine in
order to refine the location estimate. We derive novel formulas for the query
angle-of-view and depth estimation using image line segments and the OCR box
information. We demonstrate the applicability and effectiveness of the proposed
system through experiments in indoor scenarios. It is shown that our system
demonstrates better performance compared to the state-of-the-art benchmarks in
terms of location recognition rate and average localization error, especially
under sparse database conditions.
Comment: 14 pages, 22 figures
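The depth-estimation step can be illustrated with a simple pinhole-camera relation between a sign's known physical size and its OCR bounding-box height. This is a minimal sketch under assumed parameter names, not the paper's derived formulas (which also use image line segments):

```python
def depth_from_ocr_box(box_height_px, real_height_m, focal_px):
    """Estimate camera-to-sign distance with a pinhole model.

    box_height_px: height of the detected OCR bounding box in pixels
    real_height_m: known physical height of the sign's characters
    focal_px: camera focal length expressed in pixels
    """
    if box_height_px <= 0:
        raise ValueError("bounding box height must be positive")
    return focal_px * real_height_m / box_height_px

# A 0.20 m tall sign imaged at 100 px with a 1000 px focal length
# lies roughly 2 m from the camera.
print(depth_from_ocr_box(100, 0.20, 1000.0))
```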
On the Rates of Convergence in Learning of Optimal Temporally Fair Schedulers
Multi-user schedulers are designed to achieve optimal average system utility
(e.g. throughput) subject to a set of fairness criteria. In this work,
scheduling under temporal fairness constraints is considered. Prior works have
shown that a class of scheduling strategies called threshold based strategies
(TBSs) achieve optimal system utility under temporal fairness constraints. The
optimal TBS thresholds are determined as a function of the channel statistics.
In order to provide performance guarantees for TBSs in practical scenarios ---
where the scheduler learns the optimal thresholds based on the empirical
observations of the channel realizations --- it is necessary to evaluate the
rates of convergence of TBS thresholds to the optimal value. In this work,
these rates of convergence and the effect on the resulting system utility are
investigated. It is shown that the distance between the best estimate of the
threshold vector and its optimal value is lower-bounded as a function of the
number of observations of the independent and identically distributed channel
realizations. Furthermore, it is shown that under long-term fairness
constraints, the scheduler may achieve an average utility that is higher than
the optimal long-term utility by violating the fairness criteria for a long
initial period. Consequently, the resulting system utility may converge to its
optimal long-term value from above. The results are verified by providing
simulations of practical scheduling scenarios.
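The flavor of the convergence result can be illustrated by learning a threshold as an empirical quantile of iid channel draws. This is a toy simulation, not the paper's TBS model; the uniform fading and the quantile-based threshold are assumptions for illustration:

```python
import random

def estimate_threshold(n, q=0.5, seed=0):
    """Empirical q-quantile of n iid Uniform(0,1) channel draws.

    A toy stand-in for learning a TBS threshold from empirical
    channel observations; the true threshold here is q itself.
    """
    rng = random.Random(seed)
    samples = sorted(rng.random() for _ in range(n))
    return samples[int(q * n)]

def mean_abs_error(n, trials=200):
    """Average estimation error over independent runs."""
    return sum(abs(estimate_threshold(n, seed=s) - 0.5)
               for s in range(trials)) / trials

# The estimation error shrinks as more channel realizations are observed.
print(mean_abs_error(100), mean_abs_error(10000))
```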
Real time ridge orientation estimation for fingerprint images
Fingerprint verification is an important biometric technique for personal
identification. Most of the automatic verification systems are based on
matching of fingerprint minutiae. Extraction of minutiae is an essential
process which requires estimation of orientation of the lines in an image. Most
of the existing methods involve intense mathematical computations and hence are
performed in software. In this paper, a hardware scheme for real-time
orientation estimation based on a pipelined architecture is presented.
Synthesized circuits confirm the functionality and accuracy of the suggested
method.
Comment: 8 pages, 15 figures, 1 table
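As a software reference for what such a pipeline computes, the classical gradient-based least-squares orientation estimator can be sketched as follows. This is a minimal illustration of the estimator, not the paper's hardware architecture:

```python
import math

def ridge_orientation(block):
    """Least-squares ridge orientation of a grayscale block.

    Classical gradient-based estimator: the dominant ridge direction
    is perpendicular to the average gradient direction. Returns an
    angle in [0, pi).
    """
    h, w = len(block), len(block[0])
    gxx = gyy = gxy = 0.0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = (block[y][x + 1] - block[y][x - 1]) / 2.0  # central diff
            gy = (block[y + 1][x] - block[y - 1][x]) / 2.0
            gxx += gx * gx
            gyy += gy * gy
            gxy += gx * gy
    # Gradient direction, then rotate 90 degrees for the ridge direction.
    theta = 0.5 * math.atan2(2.0 * gxy, gxx - gyy)
    return (theta + math.pi / 2.0) % math.pi

# A horizontal intensity ramp has vertical iso-intensity lines,
# so the estimated ridge orientation is pi/2.
ramp = [[float(x) for x in range(8)] for _ in range(8)]
print(ridge_orientation(ramp))
```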
Opportunistic Temporal Fair Mode Selection and User Scheduling for Full-duplex Systems
In-band full-duplex (FD) communication - enabled by recent advances in
antenna and RF circuit design - has emerged as one of the promising techniques
to improve data rates in wireless systems. One of the major roadblocks in
enabling high data rates in FD systems is the inter-user interference (IUI) due
to activating pairs of uplink and downlink users at the same time-frequency
resource block. Opportunistic user scheduling has been proposed as a means to
manage IUI and fully exploit the multiplexing gains in FD systems. In this
paper, scheduling under long-term and short-term temporal fairness for
single-cell FD wireless networks is considered. Temporal fair scheduling is of
interest in delay-sensitive applications, and leads to predictable latency and
power consumption. The feasible region of user temporal demand vectors is
derived, and a scheduling strategy maximizing the system utility while
satisfying long-term temporal fairness is proposed. Furthermore, a short-term
temporal fair scheduling strategy is devised which satisfies user temporal
demands over a finite window-length. It is shown that the strategy achieves
optimal average system utility as the window-length is increased
asymptotically. Subsequently, practical construction algorithms for long-term
and short-term temporal fair scheduling are introduced. Simulations are
provided to verify the derivations and investigate the multiplexing gains. It
is observed that using successive interference cancellation at downlink users
improves FD gains significantly in the presence of strong IUI.
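The IUI-aware pairing step can be sketched as an exhaustive search over uplink/downlink pairs. The utility below is an illustrative assumption and omits the temporal-fairness constraints the paper enforces:

```python
def best_fd_pair(ul_rate, dl_rate, iui):
    """Pick the uplink/downlink user pair maximizing FD sum utility.

    ul_rate[i]: rate of uplink user i; dl_rate[j]: rate of downlink
    user j; iui[i][j]: rate loss at downlink user j caused by uplink
    user i transmitting on the same resource block.
    """
    best, best_u = None, float("-inf")
    for i, ru in enumerate(ul_rate):
        for j, rd in enumerate(dl_rate):
            u = ru + max(rd - iui[i][j], 0.0)  # clamp: rates are nonnegative
            if u > best_u:
                best, best_u = (i, j), u
    return best, best_u

pair, util = best_fd_pair([3.0, 1.0], [2.0, 4.0],
                          [[0.5, 3.5], [0.0, 0.5]])
print(pair, util)
```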
1D Modeling of Sensor Selection Problem for Weak Barrier Coverage and Gap Mending in Wireless Sensor Networks
In this paper, we first remodel the line coverage as a 1D discrete problem
with co-linear targets. Then, an order-based greedy algorithm, called OGA, is
proposed to solve the problem optimally. It is shown that the inherent
order in the 1D modeling, and especially the resulting Markov property of the
selected sensors, can help in designing greedy algorithms such as OGA. These
algorithms demonstrate optimal/efficient performance and have lower complexity
compared to the state-of-the-art. Furthermore, it is demonstrated that the
conventional continuous line coverage problem can be converted to an equivalent
discrete problem and solved optimally by OGA. Next, we formulate the well-known
weak barrier coverage problem as an instance of the continuous line coverage
problem (i.e. a 1D problem) as opposed to the conventional 2D graph-based
models. We demonstrate that the equivalent discrete version of this problem can
be solved optimally and faster than the state-of-the-art methods using an
extended version of OGA, called K-OGA. Moreover, an efficient local algorithm,
called LOGM, is proposed to mend barrier gaps due to sensor failure. In the
case of m gaps, LOGM is proved to select at most 2m-1 more sensors than the
optimal solution while being local and implementable in a distributed fashion. We
demonstrate the optimal/efficient performance of the proposed algorithms via
extensive simulations.
Comment: 10 pages, 11 figures
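The order-based greedy idea for 1D line coverage can be sketched as follows. This is a generic interval-covering greedy under assumed input conventions, not necessarily OGA exactly as specified in the paper:

```python
def greedy_line_cover(sensors, start, end):
    """Minimum set of sensors covering the segment [start, end].

    sensors: list of (left, right) coverage intervals on the line.
    Order-based greedy: among sensors reaching the current uncovered
    point, take the one extending farthest to the right. Returns the
    chosen intervals, or None if the segment cannot be covered.
    """
    sensors = sorted(sensors)
    chosen, covered, i = [], start, 0
    while covered < end:
        best = None
        while i < len(sensors) and sensors[i][0] <= covered:
            if best is None or sensors[i][1] > best[1]:
                best = sensors[i]
            i += 1
        if best is None or best[1] <= covered:
            return None  # gap: no sensor reaches past `covered`
        chosen.append(best)
        covered = best[1]
    return chosen

cover = greedy_line_cover([(0, 3), (2, 7), (6, 10), (1, 4)], 0, 10)
print(cover)  # [(0, 3), (2, 7), (6, 10)]
```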
Hierarchical Watermarking Framework Based on Analysis of Local Complexity Variations
The growing production and exchange of multimedia content has increased the
need for better copyright protection by means of watermarking. Different
methods have been proposed to satisfy the tradeoff between imperceptibility and
robustness as two important characteristics in watermarking while maintaining
proper data-embedding capacity. Many watermarking methods use an
image-independent set of parameters. Different images possess different
potentials for robust and
transparent hosting of watermark data. To overcome this deficiency, in this
paper we have proposed a new hierarchical adaptive watermarking framework. At
the higher level of hierarchy, complexity of an image is ranked in comparison
with complexities of images of a dataset. For a typical dataset of images, the
statistical distribution of block complexities is found. At the lower level of
the hierarchy, for a single cover image that is to be watermarked, complexities
of blocks can be found. The local complexity variation (LCV) between a block and its
neighbors is used to adaptively control the watermark strength factor of each
block. Such local complexity analysis creates an adaptive embedding scheme,
which results in higher transparency by reducing blockiness effects. This
two-level hierarchy enables our method to take advantage of all image blocks to
elevate the embedding capacity while preserving imperceptibility. For testing
the effectiveness of the proposed framework, contourlet transform (CT) in
conjunction with discrete cosine transform (DCT) is used to embed pseudo-random
binary sequences as watermark. Experimental results show that the proposed
framework elevates the performance of the watermarking routine in terms of both
robustness and transparency.
Comment: 12 pages, 14 figures, 8 tables
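The lower level of the hierarchy can be illustrated with a toy LCV computation over a 1-D scan of block complexities. The damping rule and the `base`/`gain` knobs are assumptions for illustration, not the paper's strength-factor formula:

```python
def block_complexity(block):
    """Complexity proxy: pixel variance within the block."""
    n = len(block) * len(block[0])
    mean = sum(sum(row) for row in block) / n
    return sum((p - mean) ** 2 for row in block for p in row) / n

def strength_factors(complexities, base=1.0, gain=0.5):
    """Per-block watermark strength from local complexity variation.

    complexities: 1-D list of block complexities (a flattened scan).
    The LCV of a block is the deviation of its complexity from the
    mean of its immediate neighbours; blocks that stand out from
    their neighbourhood get a damped strength to reduce blockiness.
    """
    factors = []
    for i, c in enumerate(complexities):
        nbrs = [complexities[j] for j in (i - 1, i + 1)
                if 0 <= j < len(complexities)]
        lcv = abs(c - sum(nbrs) / len(nbrs))
        factors.append(base / (1.0 + gain * lcv))
    return factors

# Uniform complexity -> no variation -> full strength everywhere.
print(strength_factors([4.0, 4.0, 4.0]))
```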
On the Fundamental Limits of Multi-user Scheduling under Short-term Fairness Constraints
In the conventional information theoretic analysis of multiterminal
communication scenarios, it is often assumed that all of the distributed
terminals use the communication channel simultaneously. However, in practical
wireless communication systems - due to restricted computation complexity at
network terminals - a limited number of users can be activated either in uplink
or downlink simultaneously. This necessitates the design of a scheduler which
determines the set of active users at each time-slot. A well designed scheduler
maximizes the average system utility subject to a set of fairness criteria,
which must be met in a limited window-length to avoid long starvation periods.
In this work, scheduling under short-term temporal fairness constraints is
considered. The objective is to maximize the average system utility such that
the fraction of the time-slots that each user is activated is within desired
upper and lower bounds in the fairness window-length. The set of feasible
window-lengths is characterized as a function of system parameters. It is shown
that the optimal system utility is non-monotonic and super-additive in
window-length. Furthermore, a scheduling strategy is proposed which satisfies
short-term fairness constraints for arbitrary window-lengths, and achieves
optimal average system utility as the window-length is increased
asymptotically. Numerical simulations are provided to verify the results.
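A window-based scheduler of this kind can be sketched as a greedy that first serves users at risk of missing their lower temporal-share bound. This is a simplified illustration, not the strategy proposed in the paper:

```python
def schedule_window(utilities, lo, hi, W):
    """Pick one active user per slot over a window of W slots.

    utilities[t][u]: utility of activating user u at slot t.
    lo[u], hi[u]: bounds on the fraction of window slots user u may
    be active (sum(lo) <= 1 <= sum(hi) assumed). Greedy sketch:
    serve users that would otherwise miss their lower bound, then
    the highest-utility feasible user.
    """
    n = len(lo)
    counts = [0] * n
    picks = []
    for t in range(W):
        remaining = W - t
        # Users whose remaining lower-bound deficit fills all remaining slots.
        urgent = [u for u in range(n) if lo[u] * W - counts[u] >= remaining]
        feasible = [u for u in range(n) if counts[u] + 1 <= hi[u] * W]
        pool = [u for u in urgent if u in feasible] or feasible
        u = max(pool, key=lambda v: utilities[t][v])
        counts[u] += 1
        picks.append(u)
    return picks, counts

picks, counts = schedule_window([[5.0, 1.0]] * 4,
                                lo=[0.25, 0.25], hi=[0.75, 0.75], W=4)
print(picks, counts)  # user 1 still gets its guaranteed share of the window
```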
Subjective and Objective Quality Assessment of Image: A Survey
With the increasing demand for image-based applications, the efficient and
reliable evaluation of image quality has increased in importance. Measuring the
image quality is of fundamental importance for numerous image processing
applications, where the goal of image quality assessment (IQA) methods is to
automatically evaluate the quality of images in agreement with human quality
judgments. Numerous IQA methods have been proposed over the past years to
fulfill this goal. In this paper, a survey of the quality assessment methods
for conventional image signals, as well as the newly emerged ones, which
includes the high dynamic range (HDR) and 3-D images, is presented. A
comprehensive explanation of the subjective and objective IQA and their
classification is provided. Six widely used subjective quality datasets, and
performance measures are reviewed. Emphasis is given to the full-reference
image quality assessment (FR-IQA) methods, and 9 often-used quality measures
(including mean squared error (MSE), structural similarity index (SSIM),
multi-scale structural similarity index (MS-SSIM), visual information fidelity
(VIF), most apparent distortion (MAD), feature similarity measure (FSIM),
feature similarity measure for color images (FSIMC), dynamic range independent
measure (DRIM), and tone-mapped images quality index (TMQI)) are carefully
described, and their performance and computation time on four subjective
quality datasets are evaluated. Furthermore, a brief introduction to 3-D IQA is
provided, and the issues related to this area of research are reviewed.
Comment: 50 pages, 12 figures, and 3 tables. This work has been submitted to
the Elsevier Journal of Visual Communication and Image Representation.
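Among the reviewed measures, MSE and the derived PSNR are the simplest full-reference metrics and can be computed directly. A minimal sketch for grayscale images stored as nested lists:

```python
import math

def mse(ref, test):
    """Mean squared error between two equal-size grayscale images."""
    n = len(ref) * len(ref[0])
    return sum((r - t) ** 2
               for rr, tr in zip(ref, test)
               for r, t in zip(rr, tr)) / n

def psnr(ref, test, peak=255.0):
    """Peak signal-to-noise ratio in dB (infinite for identical images)."""
    e = mse(ref, test)
    return float("inf") if e == 0 else 10.0 * math.log10(peak * peak / e)

a = [[100, 100], [100, 100]]
b = [[110, 100], [100, 100]]
print(mse(a, b))   # 25.0
print(psnr(a, b))  # about 34.15 dB
```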
A fast semi-automatic method for classification and counting the number and types of blood cells in an image
A novel and fast semi-automatic method for segmenting, locating, and
counting blood cells in an image is proposed. In this method, thresholding is
used to separate the nucleus from the other parts. We also use Hough transform
for circles to locate the center of white cells. Locating and counting of red
cells is performed using template matching. We use local-maximum detection,
labeling, and mean-value computation to shrink each area obtained from the
Hough transform or template matching to a single pixel representing the
location of its region. The proposed method is very fast
and computes the number and location of white cells accurately. It is also
capable of locating and counting the red cells with a small error.
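The shrink-to-one-pixel idea can be illustrated with thresholding followed by connected-component labeling, keeping one representative pixel per region. This is a toy stand-in that skips the Hough-transform and template-matching stages:

```python
def count_cells(image, threshold):
    """Locate bright blobs in a grayscale image (toy cell counter).

    Thresholding separates candidate cells from background, then a
    flood fill labels each 4-connected region and reduces it to a
    single representative pixel (its first-visited location).
    """
    h, w = len(image), len(image[0])
    mask = [[image[y][x] >= threshold for x in range(w)] for y in range(h)]
    seen = [[False] * w for _ in range(h)]
    centers = []
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                centers.append((y, x))       # one pixel per region
                stack = [(y, x)]
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
    return centers

img = [
    [0, 9, 9, 0, 0],
    [0, 9, 0, 0, 8],
    [0, 0, 0, 0, 8],
]
print(count_cells(img, 5))  # two blobs found
```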
Hardware Implementation of Adaptive Watermarking Based on Local Spatial Disorder Analysis
With the increasing use of the internet and the ease of exchange of
multimedia content, the protection of ownership rights has become a significant
concern. Watermarking is an efficient means for this purpose. In many
applications, real-time watermarking is required, which demands a
low-complexity hardware implementation of a robust algorithm. In this paper, an
adaptive watermarking scheme is presented, which embeds in different
bit-planes to achieve transparency and robustness. Local disorder of pixels is
analyzed to control the strength of the watermark. A new low complexity method
for disorder analysis is proposed, and its hardware implementation is presented.
An embedding method is proposed, which causes lower degradation in the
watermarked image. Also, the performance of the proposed watermarking
architecture is improved by a pipelined structure and is tested on an FPGA
device. Results
show that the algorithm produces transparent and robust watermarked images. The
synthesis report from FPGA implementation illustrates a low complexity hardware
structure.
Comment: 16 pages, 6 figures
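Bit-plane embedding itself reduces to simple masking, as the following minimal sketch shows (the adaptive choice of plane from the disorder measure is omitted):

```python
def embed_bit(pixel, bit, plane):
    """Embed one watermark bit into the given bit-plane of a pixel.

    plane 0 is the LSB; higher planes are more robust but less
    transparent, which is why an adaptive scheme selects the plane
    per region from a local-disorder measure.
    """
    mask = 1 << plane
    return (pixel & ~mask) | (bit << plane)

def extract_bit(pixel, plane):
    """Read back the watermark bit from the chosen bit-plane."""
    return (pixel >> plane) & 1

p = embed_bit(200, 1, 2)  # set bit-plane 2 of pixel value 200
print(p, extract_bit(p, 2))
```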